Tests are implemented and results measured against the pass/fail criteria of protocol checkers and functional coverage models. This provides an easy way to track PCI architectural features and the scope (a range of random scenarios) of testing at chip-, board-, and system-level verification.
Detailed implementation of the verification environment also identified additional goals, including the use of transaction-oriented representations for functional test generators, result checkers, and architectural reference models to increase the efficiency of writing and reusing tests, and integrated simulation and functional coverage analysis to drive closed-loop random testing and supply frequent, useful guidance to the verification team.
Verifying with HW/SW interfaces
In PCI subsystem verification, there are three critical hardware-software interfaces that must work properly: a PCI device driver, a PCI basic input/output system (BIOS), and host OS support for PCI autoconfiguration. Diagnostic programs use PCI driver and PCI BIOS routines to configure the subsystem and perform memory and I/O bus cycles, which test host and device operations such as normal reads, writes, and interrupts.
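To make the flow concrete, here is a minimal plain-Verilog sketch of what a diagnostic-style configuration and memory check looks like at the bus level; the host-bridge transactor instance (host) and its cfg_read, cfg_write, mem_write, and mem_read tasks are hypothetical names, not the API of any particular tool:

    // Hypothetical sketch: a directed diagnostic-style test driven through
    // a host-bridge transactor. Task and instance names are illustrative only.
    reg [31:0] id, data;
    initial begin : pci_diag
      // Read vendor/device ID from config space (register 0x00)
      host.cfg_read (8'h00, id);
      // Enable memory space and bus mastering in the command register (0x04)
      host.cfg_write(8'h04, 32'h0000_0006);
      // Program BAR0, then check a simple write/read-back at that address
      host.cfg_write(8'h10, 32'hE000_0000);
      host.mem_write(32'hE000_0000, 32'hA5A5_5A5A);
      host.mem_read (32'hE000_0000, data);
      if (data !== 32'hA5A5_5A5A)
        $display("ERROR: read-back mismatch, got %h", data);
    end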
I selected Avery Design Systems' (Andover, MA) Verilog-C Kernel (VCK) and "C" as the core tools for my testbench automation, coverage analysis, and HW-SW co-simulation environment. Some of the key verification technologies and methods supported by this environment included: parallel test stream automation, pseudo-random testing, enhanced Verilog HDL and C/C++ (VCI API) for transactor/test development, flexible HW/SW co-simulation, deterministic functional coverage analysis and protocol checking, and parallel simulations for scalable simulation performance.
The Avery VCK verification environment supports the verification collaborative infrastructure (VCI), which, in turn, supports a diverse range of distributed functional verification requirements including C-based test development, parallel distributed simulation, and HW/SW co-simulation of host and embedded systems architectures. The VCI capability is supported directly by VCK and through an ANSI C-based API for use with C/C++ programs and a Verilog PLI library, extending any standard Verilog simulator for VCI support.
Coverage analysis and protocol checkers
In a large verification effort, especially one involving a pseudo-random verification environment, I think automated coverage measurement is essential for understanding progress across a large set of testcases. Specifically, coverage analysis is used to evaluate overall progress, weed out duplicate and ineffective testcases, identify areas that are not being tested, and identify tests for regression suites. Coverage measurement can be categorized into two areas: code coverage (program-based) and functional coverage (transaction-specific).
Code coverage measures how much of the design has been exercised during simulation. A high level of activity indicates that all state machine controllers, datapath, and control have been exercised in at least one mode of operation. Code coverage tools measure cumulative simulations, and track state values and state transitions, data and control values, and line coverage.
Functional coverage methods include the use of monitors and assertion checkers. Monitors manage the verification process by collecting information on the occurrence of related combinations of temporal and state conditions and sequences during simulation. In this manner, a test or set of tests can be evaluated for its range of operation over the architectural features of interest. Functional testing can be dynamically augmented with more pseudo-random cases based on coverage levels measured during simulation, or completely new tests can be added. Coverage analysis data may be maintained in a database, enabling better aggregation, sorting, and analysis. Monitors provide more of an ad hoc approach to functional coverage analysis.
There is, however, still no way of guaranteeing that the deterministic and pseudo-random functional tests really exercise the design throughout its full range of expected system operation. A small subset of coverage monitors for PCI includes bus command types, target addresses, transaction processing types (posted, delayed, split, merged) and length, and termination and error conditions.
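A simple monitor of this kind can be sketched in ordinary Verilog. In the hedged example below, frame_n and cbe stand for the PCI FRAME# and C/BE[3:0] signals, and the counters and reporting task are illustrative rather than any tool's built-in API:

    // Hypothetical functional coverage monitor: counts PCI bus commands
    // sampled in the address phase (the clock on which FRAME# first asserts).
    integer cmd_count [0:15];
    integer i;
    reg     frame_q;
    initial begin
      for (i = 0; i < 16; i = i + 1) cmd_count[i] = 0;
      frame_q = 1'b1;
    end
    always @(posedge pci_clk) begin
      if (frame_q && !frame_n)               // address phase: C/BE[3:0] holds the command
        cmd_count[cbe] = cmd_count[cbe] + 1;
      frame_q = frame_n;
    end
    task report_cmd_coverage;                // call at end of test to dump hit counts
      integer j;
      begin
        for (j = 0; j < 16; j = j + 1)
          $display("command %0d : %0d hits", j, cmd_count[j]);
      end
    endtask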
Protocol checkers verify that the system properties and assumptions involving protocols, architecture, and algorithms haven't been violated during simulation. In complex concurrent systems, the interactions among protocol rules can be subtle, and problems can easily be missed. Pass/fail assertions are generated for each protocol rule definition. A protocol rule must support complex temporal relationships between signals and events. A small subset of PCI protocol rules includes device select timing, maximum transaction length, data phase timing, and termination and retry.
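As an illustration, the following plain-Verilog sketch checks one such rule, device select timing; the DEVSEL_LIMIT value is a placeholder parameter, not the exact figure from the PCI specification:

    // Hypothetical protocol check: DEVSEL# must assert within a bounded number
    // of clocks after the address phase, otherwise the master should terminate
    // with master-abort. The exact limit is parameterized here for illustration.
    parameter DEVSEL_LIMIT = 5;
    integer devsel_wait;
    reg     in_xact, frame_q;
    initial begin in_xact = 1'b0; devsel_wait = 0; frame_q = 1'b1; end
    always @(posedge pci_clk) begin
      if (frame_q && !frame_n) begin          // address phase starts a transaction
        in_xact     <= 1'b1;
        devsel_wait <= 0;
      end
      else if (in_xact && !devsel_n)
        in_xact <= 1'b0;                      // a target claimed the transaction
      else if (in_xact && devsel_n) begin
        if (devsel_wait >= DEVSEL_LIMIT) begin
          $display("PROTOCOL ERROR: no DEVSEL# within %0d clocks at %0t",
                   DEVSEL_LIMIT, $time);
          in_xact <= 1'b0;                    // report once per transaction
        end
        else
          devsel_wait <= devsel_wait + 1;
      end
      frame_q <= frame_n;
    end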
Basics of pseudo-random testing
I found that one of the most important aspects of performing system-level compliance verification is the use of random testing to generate corner cases and stress tests. This is accomplished by systematically generating numerous combinations of highly related transaction properties. Pseudo-random test generation provides an effective approach to increasing functional coverage levels after manual-directed tests have been exhausted. Intelligent pseudo-random testing provides highly efficient random methods that don't require the long simulation times of exhaustive testing. Among these useful random methods, I found constrained random case, cyclical case, boundary case, and weighted case algorithms. Additionally, the ability to convey the inherent relationships between several architectural parameters improves random case generation and its effectiveness during test design.
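These selection styles can be approximated even in ordinary Verilog using $random, as in this sketch of a weighted command choice and a boundary-biased burst length (the weights and the 64-word maximum are made-up values for illustration):

    // Hypothetical sketch of weighted and boundary-case random selection
    // using the standard $random system function.
    reg [3:0] pci_cmd;
    integer   burst_len;
    integer   roll;
    task pick_random_case;
      begin
        // Weighted command choice: favor memory traffic over config cycles
        roll = {$random} % 100;
        if      (roll < 60) pci_cmd = 4'b0111;   // memory write
        else if (roll < 90) pci_cmd = 4'b0110;   // memory read
        else                pci_cmd = 4'b1010;   // configuration read
        // Boundary-biased burst length: hit 1 and the assumed maximum often
        roll = {$random} % 10;
        if      (roll == 0) burst_len = 1;
        else if (roll == 1) burst_len = 64;      // assumed maximum for this test
        else                burst_len = 1 + ({$random} % 64);
      end
    endtask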
Pseudo-random test generation involves a three-step process. First, the architectural features are selected for a given test specification. Second, a template for the test is derived that supports the scope of the test content, including random parameters. Third, a transactor is developed that implements the physical interface between the test template and the hardware model. When several random test generators are used concurrently in a simulation, I find many problems exposed that are caused by complex interactions between independent functions in the subsystem (an important element of random transactional testing for PCI subsystem functional and performance verification).
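In outline, the template/transactor split can look like the following plain-Verilog sketch; the transactor task (master_xactor.run_cycle) is a hypothetical placeholder for whatever bus-functional model is in use, and pick_random_case is the selection task from the previous sketch:

    // Hypothetical random test template: repeatedly picks transaction parameters,
    // then hands them to a transactor that drives the PCI pins.
    integer    t;
    reg [31:0] rand_addr;
    initial begin : random_template
      for (t = 0; t < 1000; t = t + 1) begin
        pick_random_case;                          // choose command and burst length
        rand_addr = 32'hE000_0000 + ({$random} % 32'h1000);
        master_xactor.run_cycle(pci_cmd, rand_addr, burst_len);
      end
      $display("random template finished: %0d transactions", t);
    end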
Verilog extensions for robust testing
Test generators, protocol checkers, and monitors are typically coded in the Verilog hardware-description language (HDL). Verilog is a good, basic language for test development: it supports concurrent sequential processes, datatypes oriented toward hardware implementation as well as general-purpose datatypes, and tasks and functions that improve modularity. However, compared to general-purpose programming languages, Verilog lacks user-defined structures, macros, robust random functions, and interprocess synchronization primitives such as locks, mutexes, and semaphores. When it comes to testbench development, neither Verilog nor its general-purpose programming language cousins provides an efficient representation for developing pseudo-random tests, temporal protocol checkers, and functional coverage monitors.
The Verilog HDL verification language extensions (VLE), a subset of VCK, provide the functions and datatypes needed to develop manual-directed and automated pseudo-random directed tests, result checkers, functional coverage profilers, and protocol checkers. Fortunately, since VLE is an extension of Verilog, I found the learning curve for the new functionality to be short.
The VLE is based on the Verilog user-defined task and function syntax found in the IEEE 1364-1995 Verilog HDL Standard. There are a number of important features for VLE:
- List functions provide support for lists, a new datatype used to hold a collection of values. Lists are mutable and unsorted, and can contain any supported Verilog or VLE datatype, including other lists. Lists also support range notation, which allows specification of value ranges.
- Random variable functions provide functions to declare one or more random value generators into standard Verilog datatypes, including VLE's record types. Several random number generator algorithms are supported, including simple random, weighted distribution, boundary case, and cyclic recurrence, which exhaustively cycles through all values once without repetition. Legal random value subsets are defined using list functions. Complex relations are created using random variable groups and other special random variable relation functions. Random variables are assigned values by calling an update function.
- Signal history functions provide access to signal histories including variable values and change times. Query functions can access variable history at a specific simulation time or a value-change history index which looks up the previous N values.
- Protocol and temporal check tasks provide functions to check a temporal relationship of variables and expressions.
- Semaphore tasks provide synchronization functions to control access to shared resources, with both suspend and no-wait options. These are counting semaphores, which means that access to a resource can be non-exclusive (a rough plain-Verilog sketch of the idea appears after this list).
- Task enhancements add support for call by reference and for creating protected copies of task variables to support concurrent calls. Additionally, task arguments can now include memory and record types.
- Record functions provide functions to declare and access records. A record is much like a struct in C or a container object in C++. Each field of a record can be any supported Verilog or VLE datatype, including records. Record datatypes provide robust user-defined structures in Verilog. An analogous $c_record that maps directly onto a C structure can also be created.
- Profile functions analyze the functional coverage of the simulation. Profile monitors track value changes to variables, including state machines. Profiling functions support declaring profile monitors, setting legal values for variables, controlling profiler measurement updates (which can be manual or automatic), declaring profile coverage equations, and reporting profiler results. Profile coverage can be accessed dynamically during simulation, and incremental coverage is also supported.
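For readers unfamiliar with counting semaphores, a rough plain-Verilog approximation of the idea is shown below; it illustrates only the counting behavior and glosses over the suspend/no-wait options and the race between callers that pass the wait in the same time step:

    // Rough plain-Verilog approximation of a counting semaphore. This is only
    // an illustration of the concept, not the VLE built-in semaphore tasks.
    integer sem_count;
    initial sem_count = 2;            // allow two concurrent users of the resource
    task sem_get;                     // blocks until a unit is available
      begin
        wait (sem_count > 0);
        sem_count = sem_count - 1;
      end
    endtask
    task sem_put;                     // releases one unit
      sem_count = sem_count + 1;
    endtask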
The software-abstraction level
Although hardware-based testing is substantially improved through VLE-based testbench automation, system compliance must consider the interaction of hardware and software. The sharp rise in architectural complexity in ASICs and SOCs has brought with it on-chip processors, firmware, and real-time operating systems (RTOSs). System integration testing focuses on debugging firmware and device drivers, the two components that form the critical interface between software-based application programs, high-level protocol stacks, RTOSs, and the underlying hardware platform architecture of stand-alone and embedded systems.
System compliance verification involves developing tests that operate at the software-abstraction level, providing better consistency with the system application's software data-representation model and algorithms. Diagnostic test programs run in a HW/SW co-simulation environment employing a HW-SW transactor based on the VCI API.
The overall viability of plug-and-play IP components in SOC design will be determined largely by the quality and thoroughness of their supporting verification environment. An ideal example is found in the PCI Local Bus standard: the PCI SIG and System Test Integrator's Forum have clearly and objectively specified requirements for PCI component and system-level compliance verification.
Dave Duxstad is an independent consultant specializing in ASIC architecture and design. Duxstad's ASIC design experience spans the range from consumer products to Cray supercomputers.